Semi-Supervised Minimum Error Entropy Principle with Distributed Method
Authors
Abstract
Similar Resources
Decision Trees Using the Minimum Entropy-of-Error Principle
Binary decision trees based on univariate splits have traditionally employed so-called impurity functions as a means of searching for the best node splits. Such functions use estimates of the class distributions. In the present paper we introduce a new concept to binary tree design: instead of working with the class distributions of the data we work directly with the distribution of the errors ...
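As a rough illustration of that idea (a minimal sketch, not the authors' exact procedure), the snippet below scores a candidate univariate split by the Shannon entropy of the resulting error variable e = t − y, assuming two classes coded 0/1, majority-class assignment on each side of the split, and the hypothetical helper name error_entropy_split_score:

```python
import numpy as np

def error_entropy_split_score(feature, labels, threshold):
    """Score a candidate univariate split by the Shannon entropy of the
    classification errors it produces (illustrative sketch only).

    feature   : 1-D array of one attribute's values
    labels    : 1-D array of class labels in {0, 1}
    threshold : candidate split point (x <= threshold goes left)
    """
    left = feature <= threshold
    # Assign to each side the majority class of the samples it receives.
    pred = np.empty_like(labels)
    for side in (left, ~left):
        if side.any():
            pred[side] = np.bincount(labels[side], minlength=2).argmax()
    # Error variable e = t - y takes values in {-1, 0, 1} for two classes.
    errors = labels - pred
    _, counts = np.unique(errors, return_counts=True)
    p = counts / counts.sum()
    # The best split minimizes this entropy of the errors, playing the role
    # an impurity function on the class distributions plays in a usual tree.
    return -np.sum(p * np.log2(p))

x = np.array([0.1, 0.4, 0.35, 0.8, 0.9, 0.7])
t = np.array([0, 0, 0, 1, 1, 1])
print(error_entropy_split_score(x, t, 0.3))   # ~0.918; threshold 0.5 gives 0
```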
The Minimum Entropy Production Principle
It seems intuitively reasonable that Gibbs' variational principle determining the conditions of heterogeneous equilibrium can be generalized to nonequilibrium conditions. That is, a nonequilibrium steady state should be the one that makes some kind of generalized entropy production stationary; and even in the presence of irreversible fluxes, the condition for migrational equilibrium should...
Semi-Supervised Learning with Sparse Distributed Representations
For many machine learning applications, labeled data may be very difficult or costly to obtain. For instance, in the case of speech analysis, the average annotation time for a one-hour telephone conversation transcript is 400 hours [7]. To circumvent this problem, one can use semi-supervised learning algorithms, which utilize unlabeled data to improve performance on a supervised learning task. Sin...
Quantized Minimum Error Entropy Criterion
Compared with traditional learning criteria such as mean square error (MSE), the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Rényi's entropy estimator, called the information potential (IP), is a popular MEE cost in information theoretic learning (ITL). The computational complexity of IP is...
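For context, here is a minimal sketch of the quadratic information potential with a Gaussian kernel, whose pairwise double sum is the source of the O(N²) cost that quantization targets; the kernel width, the simple uniform-grid quantizer, and the function names below are assumptions for illustration, not the paper's exact estimator:

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Quadratic IP: V(e) = (1/N^2) * sum_ij k_sigma(e_i - e_j) with a Gaussian
    kernel; MEE maximizes V (equivalently minimizes Renyi's quadratic entropy
    H2 = -log V).  The double sum costs O(N^2) kernel evaluations."""
    diffs = errors[:, None] - errors[None, :]
    k = np.exp(-diffs ** 2 / (2.0 * sigma ** 2))
    return k.mean()

def quantized_information_potential(errors, sigma=1.0, step=0.5):
    """Illustrative quantized variant: collapse the errors onto a small
    codebook (here a uniform grid of width `step`, an assumption for this
    sketch) and evaluate the double sum over codewords weighted by their
    relative counts, reducing O(N^2) to O(M^2) for M << N codewords."""
    codewords, counts = np.unique(np.round(errors / step) * step,
                                  return_counts=True)
    w = counts / counts.sum()
    diffs = codewords[:, None] - codewords[None, :]
    k = np.exp(-diffs ** 2 / (2.0 * sigma ** 2))
    return float(w @ k @ w)

e = np.random.default_rng(0).normal(size=1000)
print(information_potential(e), quantized_information_potential(e))
```

The quantized version evaluates the kernel only between codewords, weighting each pair by how many error samples map to each codeword.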
A minimum relative entropy principle for AGI
In this paper the principle of minimum relative entropy (PMRE) is proposed as a fundamental principle and idea that can be used in the field of AGI. It is shown to have a very strong mathematical foundation, that it is even more fundamental than Bayes' rule or MaxEnt alone, and that it can be related to neuroscience. Hierarchical structures, hierarchies in timescales and learning and generating s...
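For reference, the relative entropy (Kullback–Leibler divergence) that the principle minimizes, written here for a discrete state space for compactness; the generic form of the update below is the standard statement of the principle rather than anything specific to this paper:

```latex
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} p(x)\,\log\frac{p(x)}{q(x)},
\qquad
P^{*} = \arg\min_{P \in \mathcal{C}} D_{\mathrm{KL}}(P \,\|\, Q)
```

Here Q is the prior and C is the set of distributions compatible with the new constraints; with a uniform prior the principle reduces to MaxEnt, and with suitable constraints it reproduces Bayesian conditioning, which is presumably the sense in which the abstract calls it more fundamental than either alone.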
Journal
Journal title: Entropy
Year: 2018
ISSN: 1099-4300
DOI: 10.3390/e20120968